Wagging: A learning approach which allows single layer perceptrons to outperform more complex learning algorithms

Authors

  • Tim Andersen
  • Tony Martinez
Abstract

Despite the well-known inability of single layer perceptrons (SLPs) to learn arbitrary functions, SLPs often exhibit reasonable generalization performance on many problems of interest. However, because of these limitations, very little effort has been made to improve their performance. In this paper we examine a method for improving the performance of SLPs which we call "wagging" (weight averaging). This method involves training several different SLPs on the same training data and then averaging their weights to obtain a single SLP. We compare the performance of the wagged SLP with that of other, more complex learning algorithms on several data sets from real-world problem domains. Surprisingly, the wagged SLP has better average generalization performance than any of the other learning algorithms on the problems tested. We provide analysis and explanations for this result, including the performance characteristics of the standard delta rule training algorithm for SLPs and the correlation between training and test set scores as training progresses.
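The following is a minimal sketch of the wagging idea as the abstract describes it, not the authors' exact experimental setup: several SLPs are trained with the standard delta rule from different random initializations and presentation orders, and their weight vectors are averaged into a single SLP. The learning rate, epoch count, and helper names are illustrative assumptions.

```python
import numpy as np

def train_slp(X, y, epochs=50, lr=0.1, seed=0):
    """Train one SLP with the standard (Widrow-Hoff) delta rule.

    X: (n_samples, n_features) inputs; y: (n_samples,) targets in {0, 1}.
    Returns the weight vector, with the bias folded in as the last weight.
    """
    rng = np.random.default_rng(seed)
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append constant bias input
    w = rng.normal(scale=0.1, size=Xb.shape[1])  # random initialization
    for _ in range(epochs):
        for i in rng.permutation(len(Xb)):       # shuffled presentation order
            out = Xb[i] @ w                      # linear unit output
            w += lr * (y[i] - out) * Xb[i]       # delta rule update
    return w

def wag(X, y, n_nets=10):
    """'Wagging': train several SLPs on the same data, average their weights."""
    ws = [train_slp(X, y, seed=s) for s in range(n_nets)]
    return np.mean(ws, axis=0)                   # weights of the single averaged SLP

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ w > 0.5).astype(int)            # threshold the linear output
```

Averaging is well defined here because every SLP computes the same linear form, so the mean of the weight vectors is itself a valid SLP; this is what distinguishes wagging from output-averaging ensembles such as bagging.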


Similar articles

The Little Neuron that Could

SLPs (single layer perceptrons) often exhibit reasonable generalization performance on many problems of interest. However, due to the well-known limitations of SLPs, very little effort has been made to improve their performance. This paper proposes a method for improving the performance of SLPs called "wagging" (weight averaging). This method involves training several different SLPs on the same ...


Application of ensemble learning techniques to model the atmospheric concentration of SO2

For pollution prediction modeling, the study adopts homogeneous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...
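As a rough sketch of the kind of setup this snippet describes (the actual features, data set, and Weka-style additive regression model are not reproduced here), the homogeneous and heterogeneous ensembles could be assembled with scikit-learn as follows. Regressor variants are used since the target, a pollutant concentration, is continuous; `X` and `y` are placeholder feature and SO2-concentration arrays.

```python
from sklearn.ensemble import (BaggingRegressor, RandomForestRegressor,
                              VotingRegressor)
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

# Homogeneous ensembles: many copies of a single kind of base learner.
random_forest = RandomForestRegressor(n_estimators=100, random_state=0)
bagged_trees = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                                random_state=0)

# Heterogeneous ensemble: average the predictions of unlike base models,
# here the same single learners the study uses as baselines.
voting = VotingRegressor([
    ("svm", SVR()),
    ("linear", LinearRegression()),
    ("forest", RandomForestRegressor(n_estimators=100, random_state=0)),
])

# Typical use, given placeholder arrays X (features) and y (SO2 levels):
#     voting.fit(X_train, y_train)
#     predictions = voting.predict(X_test)
```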


Multi-Layer Perceptrons with B-Spline Receptive Field Functions

Multi-layer perceptrons are often slow to learn nonlinear functions with complex local structure due to the global nature of their function approximations. It is shown that standard multi-layer perceptrons are actually a special case of a more general network formulation that incorporates B-splines into the node computations. This allows novel spline network architectures to be developed that c...


Semi-supervised composite kernel learning using distance metric learning techniques

The distance metric plays a key role in many machine learning and computer vision algorithms, so choosing an appropriate distance metric has a direct effect on the performance of such algorithms. Recently, distance metric learning using labeled data or other available supervisory information has become a very active research area in machine learning applications. Studies in this area have shown t...


Advanced Supervised Learning in Multi-layer Perceptrons - From Backpropagation to Adaptive Learning Algorithms

Computer Standards and Interfaces, Special Issue on Neural Networks (5), 1994. Martin Riedmiller, Institut für Logik, Komplexität und Deduktionssysteme, University of Karlsruhe, W-76128 Karlsruhe, FRG ([email protected]). Abstract: Since the presentation of the backpropagation algorithm [1] a ...




Publication date: 2004